211 research outputs found

    Towards Provenance Cloud Security Auditing Based on Association Rule Mining

    Cloud storage provides external data storage services by combining and coordinating different types of devices in a network to work collectively. However, a trust relationship must always be established between users and service providers; effective security auditing of cloud data and operational processes is therefore necessary. We propose a trusted cloud framework based on a Cloud Accountability Life Cycle (CALC). We argue that auditing provenance data in cloud servers is a practical and efficient approach, since log data is a relatively stable and easy-to-collect type of provenance data. Furthermore, we propose a scheme based on user behaviour (UB) that analyses the log data from cloud servers. We describe rules for a UB operating-system log and put forward an association rule mining algorithm based on the Long Sequence Frequent Pattern (LSFP) to extract the UB. Finally, our experimental results show that our solution can track and forensically inspect data leakage efficiently for cloud security auditing.
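    As a rough illustration of the kind of log mining involved (not the paper's LSFP algorithm itself, which is not reproduced here), the sketch below counts frequent contiguous operation sequences across hypothetical per-user server logs; the log format, sequence length and support threshold are all assumptions.

```python
from collections import Counter

# Hypothetical per-user operation logs extracted from cloud server audit records.
user_logs = {
    "alice": ["login", "open", "read", "copy", "upload", "logout"],
    "bob":   ["login", "open", "read", "read", "logout"],
    "carol": ["login", "open", "read", "copy", "upload", "logout"],
}

def frequent_sequences(logs, length=3, min_support=2):
    """Count contiguous operation subsequences of a given length and keep
    those occurring in at least `min_support` users' logs."""
    counts = Counter()
    for ops in logs.values():
        seen = {tuple(ops[i:i + length]) for i in range(len(ops) - length + 1)}
        counts.update(seen)  # count each pattern at most once per user
    return {seq: n for seq, n in counts.items() if n >= min_support}

print(frequent_sequences(user_logs))
# A pattern such as ('read', 'copy', 'upload') shared by several users could be
# flagged as a candidate data-leakage behaviour for further auditing.
```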

    Development and validation of a composite score for excessive alcohol use screening

    This study was undertaken to develop a composite measure that combines the discriminant values of individual laboratory markers routinely used for excessive alcohol use (EAU) to achieve improved screening performance. The training sample consisted of 272 individuals with a known history of EAU and 210 non-alcoholic individuals. The validation sample included 100 EAU individuals and 75 controls. We used the estimated regression coefficients and the observed marker values to calculate each individual's composite screening score; this score was converted to a probability of excessive drinking for the given individual. A threshold value for the screening score was proposed based on an examination of the estimated sensitivity and specificity associated with different threshold values. Using regression coefficients estimated from the training sample, a composite score based on the levels of aspartate aminotransferase, alanine aminotransferase, per cent carbohydrate-deficient transferrin and mean corpuscular volume was calculated. The area under the receiver operating characteristic curve (AUC) of the selected model was 0.87, indicating strong discriminating power, and this AUC was better than that of each individual test. A score above 0.23 corresponded to a sensitivity of 90% and a specificity of nearly 60%. The AUC remained at a respectable 0.83, with sensitivity and specificity of 91% and 49%, respectively, in the validation sample. We developed a novel composite score using a combination of commonly used biomarkers. However, mechanism-based biomarkers of EAU are needed to improve the screening and diagnosis of EAU in clinical practice.
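    A minimal sketch of how such a composite score can be computed from fitted regression coefficients and converted to a probability via the logistic function; the coefficient values, variable names and example marker levels below are illustrative assumptions, not the study's fitted model.

```python
import math

# Hypothetical regression coefficients; the study's fitted values are not given here.
coef = {"intercept": -9.0, "AST": 0.02, "ALT": 0.01, "CDT_pct": 1.5, "MCV": 0.05}

def composite_score(ast, alt, cdt_pct, mcv):
    """Convert a linear combination of marker values into a probability of
    excessive alcohol use via the logistic function."""
    z = (coef["intercept"] + coef["AST"] * ast + coef["ALT"] * alt
         + coef["CDT_pct"] * cdt_pct + coef["MCV"] * mcv)
    return 1.0 / (1.0 + math.exp(-z))

p = composite_score(ast=45, alt=30, cdt_pct=2.1, mcv=96)
print(f"probability of EAU: {p:.2f}",
      "-> screen positive" if p > 0.23 else "-> screen negative")
```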

    Multi-hole simultaneous drilling of aluminium alloy: a preliminary study and evaluation against one-shot drilling process

    Poly-drill heads are used in mass production to increase productivity when a large number of holes is required. In this work, drilling experiments on Al5083 aluminium alloy were carried out using a poly-drill head to measure the thrust force and assess hole quality. Chip formation and post-machining tool condition were evaluated using optical microscopy. Additional drilling tests were conducted using one-shot drilling, and the results obtained from the two drilling techniques were evaluated against each other. The results showed that the average thrust forces obtained with the poly-drill head were slightly lower than those from one-shot drilling. Improvements in hole quality, in terms of surface roughness, and a reduction in chip length were achieved using the poly-drill head. Furthermore, visual inspection of the tools showed that adhesion and built-up edges on drills used in the poly-drill head were lower than on drills used in one-shot drilling. The contribution of the input parameters to the measured outputs was determined using ANOVA.
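    As a rough sketch of how factor contributions can be estimated with ANOVA (the experimental design, factor levels and response values below are invented placeholders, not the paper's data), one common approach is to express each factor's sum of squares as a percentage of the total:

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical results table: spindle speed, feed rate and measured thrust force.
df = pd.DataFrame({
    "speed":  [1000, 1000, 2000, 2000, 1000, 1000, 2000, 2000],
    "feed":   [0.05, 0.10, 0.05, 0.10, 0.05, 0.10, 0.05, 0.10],
    "thrust": [210, 265, 190, 240, 215, 270, 185, 245],
})

# Two-way ANOVA treating speed and feed as categorical factors.
model = ols("thrust ~ C(speed) + C(feed)", data=df).fit()
table = sm.stats.anova_lm(model, typ=2)

# Percentage contribution of each factor from its sum of squares.
table["contribution_%"] = 100 * table["sum_sq"] / table["sum_sq"].sum()
print(table)
```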

    Ambient conditions disordered-ordered phase transition of two-dimensional interfacial water molecules dependent on charge dipole moment

    Phase transitions of water molecules are commonly expected to occur only under extreme conditions, such as nanoconfinement, high pressure, or low temperature. We herein report a disordered-ordered phase transition of two-dimensional interfacial water molecules under ambient conditions using molecular-dynamics simulations. This phase transition depends strongly on the charge dipole moment of the solid surface, i.e., the product of the charge values and the dipole length. The phase transition can be identified by a sharp change in the water-water interaction energies and the order parameters of the two-dimensional interfacial water monolayer under a tiny change in dipole moment near the critical dipole moment. The critical dipole moment of the solid material surface can classify a series of materials that can induce distinct ordered phases of surface water, which may also govern surface wetting, friction and other properties.
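    The paper's exact order-parameter definition is not reproduced here; as a generic illustration of how an orientational order parameter distinguishes disordered from ordered monolayers, the sketch below evaluates |<exp(2i*theta)>| over the in-plane components of water dipole vectors, with synthetic orientations standing in for simulation snapshots.

```python
import numpy as np

def inplane_order_parameter(dipoles_xy):
    """Generic 2-D orientational order parameter |<exp(2i*theta)>| computed from
    in-plane dipole components; ~0 for disordered, ~1 for ordered monolayers."""
    theta = np.arctan2(dipoles_xy[:, 1], dipoles_xy[:, 0])
    return np.abs(np.mean(np.exp(2j * theta)))

rng = np.random.default_rng(0)
# Hypothetical snapshots: random in-plane orientations vs. nearly aligned ones.
angles_dis = rng.uniform(0, 2 * np.pi, 500)
angles_ord = rng.normal(0.0, 0.1, 500)
disordered = np.column_stack([np.cos(angles_dis), np.sin(angles_dis)])
ordered = np.column_stack([np.cos(angles_ord), np.sin(angles_ord)])

print(inplane_order_parameter(disordered))  # close to 0
print(inplane_order_parameter(ordered))     # close to 1
```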

    Optimization and modeling of process parameters in multi-hole simultaneous drilling using taguchi method and fuzzy logic approach

    In industries such as aerospace and automotive, drilling many holes is commonly required to assemble different structures, and the machined holes must comply with tight geometric tolerances. Multi-spindle drilling using a poly-drill head is an industrial hole-making approach that allows several holes to be drilled simultaneously. Optimizing the process parameters further improves machining performance. This work focuses on the optimization of drilling parameters for two drilling processes, namely one-shot drilling and multi-hole drilling, using the Taguchi method. Analysis of variance and regression analysis were used to determine the significance of the drilling parameters and their impact on the measured responses, i.e., surface roughness and hole size. From the Taguchi optimization, the optimal drilling parameters were found to occur at a low cutting speed and feed rate using a poly-drill head. Furthermore, a fuzzy logic approach was employed to predict the surface roughness and hole size. The fuzzy-predicted values were in good agreement with the experimental values; therefore, the developed models can be effectively used to predict the surface roughness and hole size in multi-hole drilling. Moreover, confirmation tests were performed to validate that the Taguchi-optimized levels and the developed fuzzy models effectively represent the surface roughness and hole size.
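    A minimal sketch of the Taguchi signal-to-noise calculation that underlies this kind of optimization, using the standard smaller-is-better criterion (appropriate for surface roughness); the parameter settings and roughness replicates below are invented placeholders, not the paper's measurements.

```python
import numpy as np

def sn_smaller_is_better(values):
    """Taguchi smaller-is-better signal-to-noise ratio: -10*log10(mean(y^2))."""
    values = np.asarray(values, dtype=float)
    return -10.0 * np.log10(np.mean(values ** 2))

# Hypothetical surface-roughness replicates (um) for three parameter settings.
runs = {
    "low speed / low feed":   [0.82, 0.85, 0.80],
    "low speed / high feed":  [1.10, 1.15, 1.08],
    "high speed / high feed": [1.35, 1.30, 1.38],
}

for setting, ra in runs.items():
    print(f"{setting}: S/N = {sn_smaller_is_better(ra):.2f} dB")
# The setting with the highest S/N ratio is preferred under the
# smaller-is-better criterion.
```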

    A framework of lightweight deep cross-connected convolution kernel mapping support vector machines

    Deep kernel mapping support vector machines have achieved good results in numerous tasks by mapping features from a low-dimensional space to a high-dimensional space and then using support vector machines for classification. However, the deep kernel mapping support vector machine does not take into account the connections between different dimensional spaces and increases the number of model parameters. To further improve the recognition capability of deep kernel mapping support vector machines while reducing the number of model parameters, this paper proposes a framework of Lightweight Deep Convolutional Cross-Connected Kernel Mapping Support Vector Machines (LC-CKMSVM). The framework consists of a feature extraction module and a classification module. The feature extraction module first maps the data from a low-dimensional to a high-dimensional space by fusing the representations of different dimensional spaces through cross-connections; it then uses depthwise separable convolution to replace part of the original convolution and reduce the number of parameters in the module. The classification module uses a soft-margin support vector machine for classification. The results on six different visual datasets show that LC-CKMSVM obtains better classification accuracies than the other five models in most cases.
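    To illustrate the parameter saving from replacing a standard convolution with a depthwise separable one (a generic sketch, not the paper's LC-CKMSVM architecture; layer sizes are arbitrary examples):

```python
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """Depthwise convolution (one filter per channel) followed by a 1x1
    pointwise convolution; a common drop-in replacement for a standard
    convolution that greatly reduces the parameter count."""
    def __init__(self, in_ch, out_ch, kernel_size=3, padding=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size,
                                   padding=padding, groups=in_ch)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))

def n_params(m):
    return sum(p.numel() for p in m.parameters())

standard = nn.Conv2d(64, 128, kernel_size=3, padding=1)
separable = DepthwiseSeparableConv(64, 128)
print(n_params(standard), "vs", n_params(separable))  # 73856 vs 8960
```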

    Physical layer authentication using ensemble learning technique in wireless communications

    Cyber-physical wireless systems have surfaced as an important data communication and networking research area. They form an emerging discipline that allows effective monitoring and efficient real-time communication between the cyber and physical worlds by embedding computer software and integrating communication and networking technologies. Due to their high reliability, sensitivity and connectivity, their security requirements are comparable to those of the Internet, as they are prone to various security threats such as eavesdropping, spoofing, botnets, man-in-the-middle attacks, denial of service (DoS), distributed denial of service (DDoS) and impersonation. Existing methods use physical layer authentication (PLA), the most promising solution for detecting cyber-attacks. Still, cyber-physical systems (CPS) have relatively large computational requirements and need more communication resources, making it impossible to achieve a low-latency target; these methods perform well, but only in stationary scenarios. We extracted the relevant features from the channel matrices using the discrete wavelet transform to reduce the computational time required for data processing, and we considered mobile scenarios. The features are fed to ensemble learning algorithms, such as AdaBoost, LogitBoost and GentleBoost, to classify the data. The authentication of the received signal is treated as a binary classification problem: the transmitted data is labeled as legitimate information, and spoofed data as illegitimate information. This paper therefore proposes a threshold-free PLA approach that uses machine learning algorithms to protect critical data from spoofing attacks. It detects malicious data packets in stationary scenarios and does so with high accuracy even when receivers are mobile. The proposed model achieves better performance than existing approaches in terms of accuracy and computational time by decreasing the processing time.
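    A simplified sketch of the wavelet-feature-plus-boosting pipeline described above, using PyWavelets and scikit-learn; the synthetic "channel" data, wavelet choice, decomposition level and feature statistics are all assumptions for illustration, not the paper's setup.

```python
import numpy as np
import pywt
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def dwt_features(channel_row, wavelet="db4", level=3):
    """Summarize one channel response with statistics of its wavelet coefficients."""
    coeffs = pywt.wavedec(channel_row, wavelet, level=level)
    return np.array([f(c) for c in coeffs for f in (np.mean, np.std)])

# Synthetic stand-ins for channel estimates from legitimate vs. spoofing transmitters.
legit = rng.normal(0.0, 1.0, (200, 64))
spoof = rng.normal(0.5, 1.2, (200, 64))
X = np.array([dwt_features(r) for r in np.vstack([legit, spoof])])
y = np.array([0] * 200 + [1] * 200)   # 0 = legitimate, 1 = spoofed

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = AdaBoostClassifier(n_estimators=100).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```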

    Intrusion detection based on bidirectional long short-term memory with attention mechanism

    With the recent developments in the Internet of Things (IoT), the amount of data collected has expanded tremendously, resulting in higher demand for data storage, computational capacity, and real-time processing capabilities. Cloud computing has traditionally played an important role in establishing the IoT. More recently, fog computing has emerged as a complementary field to cloud computing due to its enhanced mobility, location awareness, heterogeneity, scalability, low latency, and geographic distribution. However, IoT networks are vulnerable to unwanted attacks because of their open and shared nature. As a result, various fog computing-based security models that protect IoT networks have been developed. A distributed architecture based on an intrusion detection system (IDS) provides a dynamic, scalable IoT environment that can disperse centralized tasks to local fog nodes and successfully detect advanced malicious threats. In this study, we examined the time-related aspects of network traffic data. We present an intrusion detection model based on a two-layered bidirectional long short-term memory (Bi-LSTM) network with an attention mechanism for traffic data classification, verified on the UNSW-NB15 benchmark dataset. We show that the proposed model outperformed numerous leading-edge network IDSs based on machine learning models in terms of accuracy, precision, recall and F1 score.
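    A compact sketch of a two-layer Bi-LSTM with additive attention pooling for binary traffic classification; it is a generic illustration of the architecture described above, and the hidden size, number of features (chosen to resemble UNSW-NB15 flow records) and window length are assumptions rather than the paper's exact configuration.

```python
import torch
import torch.nn as nn

class BiLSTMAttention(nn.Module):
    """Simplified two-layer bidirectional LSTM with attention pooling
    for binary traffic classification (normal vs. attack)."""
    def __init__(self, n_features, hidden=64, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, num_layers=2,
                            bidirectional=True, batch_first=True)
        self.attn = nn.Linear(2 * hidden, 1)    # one score per time step
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                        # x: (batch, time, features)
        h, _ = self.lstm(x)                      # (batch, time, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)   # attention weights over time
        context = (w * h).sum(dim=1)             # weighted sum of hidden states
        return self.fc(context)

# Example forward pass on dummy data shaped like windowed flow records.
model = BiLSTMAttention(n_features=42)
logits = model(torch.randn(8, 10, 42))
print(logits.shape)  # torch.Size([8, 2])
```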